On Normalized Mutual Information: Measure Derivations and Properties
Author
Abstract
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
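To make the normalization idea concrete, the sketch below computes an NMI from a joint probability mass function by dividing the mutual information by min(H(X), H(Y)), one standard least upper bound on I(X; Y). This is an illustrative assumption only; the paper derives its own alternative bounds, which are not reproduced here.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def nmi(joint):
    """NMI of two discrete random variables given their joint pmf.

    I(X;Y) = H(X) + H(Y) - H(X,Y), normalized here by the common
    least upper bound min(H(X), H(Y)) -- an assumed choice, not the
    specific bounds derived in the paper.
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X
    py = joint.sum(axis=0)  # marginal of Y
    mi = entropy(px) + entropy(py) - entropy(joint)
    return mi / min(entropy(px), entropy(py))

# Perfect dependence gives NMI = 1; independence gives NMI = 0.
dependent = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(nmi(dependent))    # 1.0
print(nmi(independent))  # 0.0
```

Because I(X; Y) ≤ min(H(X), H(Y)) always holds, this normalization keeps the measure in [0, 1], which is the property the NMI constructions in the paper are built around.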
Related Papers
Generalized Partial Volume: An Inferior Density Estimator to Parzen Windows for Normalized Mutual Information
Mutual Information (MI) and normalized mutual information (NMI) are popular choices as similarity measure for multimodal image registration. Presently, one of two approaches is often used for estimating these measures: The Parzen Window (PW) and the Generalized Partial Volume (GPV). Their theoretical relation has so far been unexplored. We present the direct connection between PW and GPV for NM...
High-Dimensional Normalized Mutual Information for Image Registration Using Random Lines
Mutual information has been successfully used as an effective similarity measure for multimodal image registration. However, a drawback of the standard mutual information-based computation is that the joint histogram is only calculated from the correspondence between individual voxels in the two images. In this paper, the normalized mutual information measure is extended to consider the corresp...
Normalized Conditional Possibility Distributions and Informational Connection Between Fuzzy Variables
In this article, we introduce a general concept of fuzzy operators. These operators are then used to generalize the possibilistic conditioning formulation proposed by Nguyen [1]. This generalization depends on the relation which exists between this conditioning and the probabilistic t-norm. By using other t-norms, other conditionings are obtained, and their properties are studied. One applicati...
Normalized similarity measures for medical image registration
Two new similarity measures for rigid image registration, based on the normalization of Jensen’s difference applied to Rényi and Tsallis-Havrda-Charvát entropies, are introduced. One measure is normalized by the first term of Jensen’s difference, which in our proposal coincides with the marginal entropy, and the other by the joint entropy. These measures can be seen as an extension of two measu...
A revisit to evaluating accuracy of community detection using the normalized mutual information
Normalized Mutual Information (NMI) has been widely used to evaluate the accuracy of community detection algorithms. In this note we show that NMI is seriously affected by a systematic error due to the finite size of networks, and may give a wrong estimate of the performance of algorithms in some cases. A simple expression for the estimate of this error is derived and tested numerically. We suggest to use a n...
Journal: Entropy
Volume 19, Issue: -
Pages: -
Published: 2017